
    CFT Duals for Extreme Black Holes

    It is argued that the general four-dimensional extremal Kerr-Newman-AdS-dS black hole is holographically dual to a (chiral half of a) two-dimensional CFT, generalizing an argument given recently for the special case of extremal Kerr. Specifically, the asymptotic symmetries of the near-horizon region of the general extremal black hole are shown to be generated by a Virasoro algebra. Semiclassical formulae are derived for the central charge and temperature of the dual CFT as functions of the cosmological constant, Newton's constant, and the black hole charges and spin. We then show, assuming the Cardy formula, that the microscopic entropy of the dual CFT precisely reproduces the macroscopic Bekenstein-Hawking area law. This CFT description becomes singular in the extreme Reissner-Nordström limit, where the black hole has no spin. At this point a second dual CFT description is proposed, in which the global part of the U(1) gauge symmetry is promoted to a Virasoro algebra. This second description is also found to reproduce the area law. Various further generalizations, including higher dimensions, are discussed. Comment: 18 pages; v2 minor changes
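    For the extremal Kerr special case that this result generalizes, the Cardy check can be made explicit. The central charge and temperature below are the standard Kerr/CFT values, quoted here as background rather than taken from this abstract, in units G = ħ = c = 1:

    ```latex
    % Extremal Kerr (Kerr/CFT): central charge and Frolov-Thorne temperature
    c_L = 12J, \qquad T_L = \frac{1}{2\pi}, \qquad
    S_{\mathrm{Cardy}} = \frac{\pi^2}{3}\, c_L T_L
      = \frac{\pi^2}{3} \cdot 12J \cdot \frac{1}{2\pi}
      = 2\pi J = \frac{A}{4} = S_{\mathrm{BH}}.
    ```

    The same pattern (compute c and T from the near-horizon geometry, then insert them into the Cardy formula) is what the abstract extends to the full Kerr-Newman-AdS-dS family.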

    An almost sure limit theorem for super-Brownian motion

    We establish an almost sure scaling limit theorem for super-Brownian motion on $\mathbb{R}^d$ associated with the semi-linear equation $u_t = \frac{1}{2}\Delta u + \beta u - \alpha u^2$, where $\alpha$ and $\beta$ are positive constants. In this case, the spectral-theoretic assumptions required in Chen et al. (2008) are not satisfied. An example is given to show that the main results also hold for some sub-domains in $\mathbb{R}^d$. Comment: 14 pages

    Towards a Universal Theory of Artificial Intelligence based on Algorithmic Probability and Sequential Decision Theory

    Decision theory formally solves the problem of rational agents in uncertain worlds if the true environmental probability distribution is known. Solomonoff's theory of universal induction formally solves the problem of sequence prediction for unknown distributions. We unify both theories and give strong arguments that the resulting universal AIXI model behaves optimally in any computable environment. The major drawback of the AIXI model is that it is uncomputable. To overcome this problem, we construct a modified algorithm AIXI^tl, which is still superior to any other time t and space l bounded agent. The computation time of AIXI^tl is of the order t·2^l. Comment: 8 two-column pages, latex2e, 1 figure, submitted to ijcai
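    A minimal caricature of the time/space-bounded construction: enumerate every policy describable in at most l bits, run each for at most t steps, and act with the best scorer. The environment and all names below are illustrative assumptions (the real AIXI^tl additionally requires each policy to carry a proof of its claimed value); the sketch only shows where the t·2^l cost comes from:

    ```python
    import itertools

    def toy_env_reward(policy_bits):
        """Toy stand-in environment: reward is the number of '1'
        actions the policy emits (illustrative only)."""
        return sum(policy_bits)

    def bounded_best_policy(l, t):
        """Brute-force search over all length-<=l bit policies,
        each evaluated for at most t steps."""
        best, best_reward = None, float("-inf")
        for n in range(l + 1):
            for bits in itertools.product((0, 1), repeat=n):
                r = toy_env_reward(bits[:t])  # time bound: only t steps count
                if r > best_reward:
                    best, best_reward = bits, r
        return best, best_reward

    policy, reward = bounded_best_policy(l=4, t=3)
    ```

    There are about 2^l candidate policies and each costs at most t steps to evaluate, which mirrors the order-t·2^l computation time quoted in the abstract.
    
    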

    Numerical search for a fundamental theory

    We propose a numerical test of fundamental physics based on the complexity measure of a general set of functions, which is directly related to the Kolmogorov (or algorithmic) complexity studied in mathematics and computer science. The analysis can be carried out for any scientific experiment and might lead to a better understanding of the underlying theory. From a cosmological perspective, the anthropic description of fundamental constants can be explicitly tested by our procedure. We perform a simple numerical search by analyzing two fundamental constants: the weak coupling constant and the Weinberg angle, and find that their values are rather atypical. Comment: 6 pages, 3 figures, RevTeX, expansion and clarification, references added
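    Kolmogorov complexity itself is uncomputable, so any numerical search must substitute a computable upper bound. A common proxy is compressed length; the snippet below uses zlib on digit strings purely as an illustration of that substitution (the string lengths and comparison are assumptions, not the paper's actual procedure):

    ```python
    import random
    import zlib

    def compressed_len(s: str) -> int:
        """Length of the zlib-compressed bytes: a crude, computable
        upper bound on Kolmogorov complexity (up to a constant)."""
        return len(zlib.compress(s.encode()))

    # A highly regular digit string compresses far better than a
    # pseudo-random one of the same length, i.e. it has much lower
    # (approximate) algorithmic complexity.
    regular = "12" * 500  # 1000 characters, highly compressible
    random.seed(0)
    noisy = "".join(random.choice("0123456789") for _ in range(1000))

    print(compressed_len(regular), compressed_len(noisy))
    ```

    A constant whose digit expansion compresses unusually well (or badly) relative to typical values would count as "atypical" in this crude sense.
    
    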

    Organization of complex networks without multiple connections

    We find a new structural feature of equilibrium complex random networks without multiple and self-connections. We show that if the number of connections is sufficiently high, these networks contain a core of highly interconnected vertices. The number of vertices in this core varies in the range between $\mathrm{const}\,N^{1/2}$ and $\mathrm{const}\,N^{2/3}$, where $N$ is the number of vertices in a network. At the birth point of the core, we obtain the size-dependent cut-off of the distribution of the number of connections and find that its position differs from earlier estimates. Comment: 5 pages, 2 figures
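    To get a feel for the gap between the two scaling bounds on the core size, one can tabulate them for a few network sizes. Both prefactor constants are set to 1 below, which is an assumption for illustration (the paper's constants are not given in this abstract):

    ```python
    # Compare the core-size bounds const*N**(1/2) and const*N**(2/3)
    # with both constants set to 1 (illustrative assumption).
    # Their ratio grows as N**(1/6), so the bounds diverge slowly.
    for N in (10**3, 10**6, 10**9):
        lower = N ** 0.5
        upper = N ** (2 / 3)
        print(f"N={N:>10}  N^(1/2)={lower:>10,.0f}  N^(2/3)={upper:>10,.0f}  "
              f"ratio={upper / lower:.1f}")
    ```
    
    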

    Probabilistic models of information retrieval based on measuring the divergence from randomness

    We introduce a framework for deriving probabilistic models of Information Retrieval. The models are nonparametric models of IR obtained in the language model approach. We derive term-weighting models by measuring the divergence of the actual term distribution from that obtained under a random process. Among the random processes we study the binomial distribution and Bose-Einstein statistics. We define two types of term frequency normalization for tuning term weights in the document-query matching process. The first normalization assumes that documents have the same length and measures the information gain with the observed term once it has been accepted as a good descriptor of the observed document. The second normalization is related to the document length and to other statistics. These two normalization methods are applied to the basic models in succession to obtain weighting formulae. Results show that our framework produces different nonparametric models forming baseline alternatives to the standard tf-idf model.
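    As one concrete instance of measuring divergence from randomness, the geometric approximation to Bose-Einstein statistics gives the informative content -log2 Prob(tf), with Prob(tf) = (1/(1+λ))·(λ/(1+λ))^tf and λ = F/N the term's mean frequency across the collection. The sketch below implements just that one weight; the function name and toy statistics are assumptions, and the framework's two normalizations are deliberately omitted:

    ```python
    from math import log2

    def dfr_geometric_weight(tf: int, F: int, N: int) -> float:
        """Informative content -log2 Prob(tf) under the geometric
        approximation to Bose-Einstein statistics.
        tf: term frequency in the document,
        F : total occurrences of the term in the collection,
        N : number of documents in the collection."""
        lam = F / N  # expected term frequency under the random process
        prob = (1 / (1 + lam)) * (lam / (1 + lam)) ** tf
        return -log2(prob)

    # A term occurring often in one document but rarely in the
    # collection diverges strongly from the random model, so it
    # gets a much higher weight than a uniformly common term.
    rare_bursty = dfr_geometric_weight(tf=5, F=10, N=10_000)
    common_flat = dfr_geometric_weight(tf=5, F=5_000, N=10_000)
    ```

    This behaves qualitatively like tf-idf (rare, bursty terms score high) while being derived from an explicit random process rather than from heuristics.
    
    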

    Towards a universal theory of artificial intelligence based on algorithmic probability and sequential decisions

    Decision theory formally solves the problem of rational agents in uncertain worlds if the true environmental probability distribution is known. Solomonoff's theory of universal induction formally solves the problem of sequence prediction for unknown distributions. We unify both theories and give strong arguments that the resulting universal AIξ model behaves optimally in any computable environment. The major drawback of the AIξ model is that it is uncomputable. To overcome this problem, we construct a modified algorithm AIξtl, which is still superior to any other time t and length l bounded agent. The computation time of AIξtl is of the order t·2^l. This work was supported by SNF grant 2000-61847.00 to Jürgen Schmidhuber.

    Synthesizing Program Input Grammars

    We present an algorithm for synthesizing a context-free grammar encoding the language of valid program inputs from a set of input examples and blackbox access to the program. Our algorithm addresses shortcomings of existing grammar inference algorithms, which both severely overgeneralize and are prohibitively slow. Our implementation, GLADE, leverages the grammar synthesized by our algorithm to fuzz test programs with structured inputs. We show that GLADE substantially increases the incremental coverage on valid inputs compared to two baseline fuzzers.
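    The core loop of such a synthesizer can be caricatured: propose a candidate generalization of an example (say, "this region may repeat"), and keep it only if strings sampled from the generalized language still pass the blackbox oracle. Everything below, including the balanced-parenthesis oracle and the single repetition check, is a toy assumption and not GLADE's actual algorithm:

    ```python
    def oracle(s: str) -> bool:
        """Blackbox stand-in for the target program: accepts
        exactly the balanced-parenthesis strings (toy example)."""
        depth = 0
        for ch in s:
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
                if depth < 0:
                    return False
            else:
                return False
        return depth == 0

    def repetition_generalizes(example: str, oracle, trials=(0, 2, 3)) -> bool:
        """Check whether repeating the example stays valid, i.e.
        whether a Kleene-star generalization is oracle-consistent."""
        return all(oracle(example * k) for k in trials)

    assert repetition_generalizes("()", oracle)     # ()() etc. stay valid
    assert not repetition_generalizes("(", oracle)  # unbalanced is rejected
    ```

    Accepted generalizations become grammar rules; rejected ones keep the grammar from overgeneralizing, which is exactly the failure mode of prior inference algorithms that the abstract highlights.
    
    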

    MDL Convergence Speed for Bernoulli Sequences

    The Minimum Description Length principle for online sequence estimation/prediction in a proper learning setup is studied. If the underlying model class is discrete, then the total expected square loss is a particularly interesting performance measure: (a) this quantity is finitely bounded, implying convergence with probability one, and (b) it additionally specifies the convergence speed. For MDL, in general one can only have loss bounds which are finite but exponentially larger than those for Bayes mixtures. We show that this is even the case if the model class contains only Bernoulli distributions. We derive a new upper bound on the prediction error for countable Bernoulli classes. This implies a small bound (comparable to the one for Bayes mixtures) for certain important model classes. We discuss the application to Machine Learning tasks such as classification and hypothesis testing, and generalization to countable classes of i.i.d. models. Comment: 28 pages
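    A minimal sketch of two-part MDL over a countable Bernoulli class: each candidate bias θ gets a parameter codelength (the dyadic coding of 2d bits below is an illustrative assumption, not the paper's coding), and the estimator picks the θ minimizing parameter cost plus data codelength:

    ```python
    from math import log2

    def nll_bits(data, theta):
        """Codelength of the data under Bernoulli(theta), in bits."""
        return sum(-log2(theta if x == 1 else 1 - theta) for x in data)

    def mdl_estimate(data, max_depth=6):
        """Two-part MDL over the countable class of dyadic biases
        theta = i / 2**d, 0 < i < 2**d. Parameter cost 2*d bits is
        a simple prefix-code assumption for illustration."""
        best_theta, best_len = None, float("inf")
        for d in range(1, max_depth + 1):
            for i in range(1, 2 ** d):
                theta = i / 2 ** d
                total = 2 * d + nll_bits(data, theta)  # K(theta) + L(data|theta)
                if total < best_len:
                    best_theta, best_len = theta, total
        return best_theta

    # With enough data the MDL choice settles on the true bias.
    data = [1, 1, 1, 0] * 50  # empirical frequency 0.75
    ```

    The abstract's point is about how fast such an estimator's predictions converge: the total expected square loss bounds it proves are finite, though in general exponentially larger than the corresponding Bayes-mixture bounds.
    
    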

    Communication Capacity of Quantum Computation

    By considering quantum computation as a communication process, we relate its efficiency to a communication capacity. This formalism allows us to rederive lower bounds on the complexity of search algorithms. It also enables us to link the mixedness of a quantum computer to its efficiency. We discuss the implications of our results for quantum measurement. Comment: 4 pages, revtex